Relation-Aware Language-Graph Transformer for Question Answering

Authors

Abstract

Question Answering (QA) is a task that entails reasoning over natural language contexts, and many relevant works augment language models (LMs) with graph neural networks (GNNs) to encode the Knowledge Graph (KG) information. However, most existing GNN-based modules for QA do not take advantage of the rich relational information of KGs and depend on limited information interaction between the LM and the KG. To address these issues, we propose Question Answering Transformer (QAT), which is designed to jointly reason over language and graphs with respect to entity relations in a unified manner. Specifically, QAT constructs Meta-Path tokens, which learn relation-centric embeddings based on diverse structural and semantic relations. Then, our Relation-Aware Self-Attention module comprehensively integrates different modalities via the Cross-Modal Relative Position Bias, which guides information exchange between relevant entities of different modalities. We validate the effectiveness of QAT on commonsense question answering datasets like CommonsenseQA and OpenBookQA, and on a medical question answering dataset, MedQA-USMLE. On all the datasets, our method achieves state-of-the-art performance. Our code is available at http://github.com/mlvlab/QAT.
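As a rough, illustrative sketch of the mechanism described in the abstract, the PyTorch module below jointly attends over language-model tokens and KG-derived Meta-Path tokens and adds a learned cross-modal bias to the attention logits, so that within-modality and cross-modality interactions can be weighted differently. This is not the released QAT implementation; the module name, tensor shapes, and the per-head 2x2 bias table are assumptions made for the example.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class RelationAwareSelfAttention(nn.Module):
        """Joint attention over LM tokens and Meta-Path (KG) tokens (illustrative only)."""

        def __init__(self, dim: int, num_heads: int = 8):
            super().__init__()
            self.num_heads = num_heads
            self.head_dim = dim // num_heads
            self.qkv = nn.Linear(dim, dim * 3)
            self.proj = nn.Linear(dim, dim)
            # One learned bias per head and (query modality, key modality) pair:
            # index 0 = LM token, index 1 = Meta-Path token.
            self.cross_modal_bias = nn.Parameter(torch.zeros(num_heads, 2, 2))

        def forward(self, lm_tokens: torch.Tensor, mp_tokens: torch.Tensor) -> torch.Tensor:
            # lm_tokens: (B, N_lm, D) contextual LM embeddings
            # mp_tokens: (B, N_mp, D) relation-centric Meta-Path token embeddings
            x = torch.cat([lm_tokens, mp_tokens], dim=1)                    # (B, N, D)
            B, N, D = x.shape
            modality = torch.cat([torch.zeros(lm_tokens.size(1), dtype=torch.long),
                                  torch.ones(mp_tokens.size(1), dtype=torch.long)]).to(x.device)

            q, k, v = self.qkv(x).chunk(3, dim=-1)
            q = q.view(B, N, self.num_heads, self.head_dim).transpose(1, 2)
            k = k.view(B, N, self.num_heads, self.head_dim).transpose(1, 2)
            v = v.view(B, N, self.num_heads, self.head_dim).transpose(1, 2)

            logits = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5       # (B, H, N, N)
            bias = self.cross_modal_bias[:, modality][:, :, modality]       # (H, N, N)
            attn = F.softmax(logits + bias.unsqueeze(0), dim=-1)

            out = (attn @ v).transpose(1, 2).reshape(B, N, D)
            return self.proj(out)

The bias term is included only to show how attention between an LM token and a Meta-Path token can be treated differently from attention within a single modality; the Cross-Modal Relative Position Bias in the paper itself is more involved.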


Similar Articles

Learning of Graph Rules for Question Answering

AnswerFinder is a framework for the development of question-answering systems. AnswerFinder is currently being used to test the applicability of graph representations for the detection and extraction of answers. In this paper we briefly describe AnswerFinder and introduce our method to learn graph patterns that link questions with their corresponding answers in arbitrary sentences. The method i...


Semantic Parsing for Single-Relation Question Answering

We develop a semantic parsing framework based on semantic similarity for open domain question answering (QA). We focus on single-relation questions and decompose each question into an entity mention and a relation pattern. Using convolutional neural network models, we measure the similarity of entity mentions with entities in the knowledge base (KB) and the similarity of relation patterns and r...
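As a loose illustration of this decomposition, the toy scorer below matches an entity-mention embedding against a KB entity embedding and a relation-pattern embedding against a KB relation embedding, then combines the two scores. The paper measures these similarities with trained convolutional semantic models; this sketch substitutes plain cosine similarity over stand-in vectors, and every name in it is hypothetical.

    import numpy as np

    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity between two embedding vectors.
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

    def score(mention_vec, pattern_vec, entity_vec, relation_vec) -> float:
        # A single-relation question is split into an entity mention and a relation
        # pattern; each half is matched against the KB side and the scores combined.
        return cosine(mention_vec, entity_vec) * cosine(pattern_vec, relation_vec)

    # Hypothetical usage: rank candidate (entity, relation) pairs for one question.
    rng = np.random.default_rng(0)
    mention_vec, pattern_vec = rng.normal(size=50), rng.normal(size=50)
    candidates = {("barack_obama", "place_of_birth"): (rng.normal(size=50), rng.normal(size=50)),
                  ("barack_obama", "spouse"): (rng.normal(size=50), rng.normal(size=50))}
    best = max(candidates, key=lambda c: score(mention_vec, pattern_vec, *candidates[c]))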


Exploring Syntactic Relation Patterns for Question Answering

In this paper, we explore syntactic relation patterns for open-domain factoid question answering. We propose a pattern extraction method to extract the various relations between the proper answers and different types of question words, including target words, head words, subject words and verbs, from syntactic trees. We further propose a QA-specific tree kernel to partially match the syntact...
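To give a concrete feel for the kind of pattern such an approach mines, the toy function below recovers the path between an answer candidate and a question's target word in a dependency tree represented as a child-to-head map. It illustrates relation-path extraction in general rather than the paper's extraction method or its tree kernel, and the example sentence and head map are invented.

    from typing import Dict, List

    def path_to_root(node: str, head: Dict[str, str]) -> List[str]:
        # Follow head pointers from a token up to the root of the dependency tree.
        path = [node]
        while node in head:
            node = head[node]
            path.append(node)
        return path

    def relation_path(a: str, b: str, head: Dict[str, str]) -> List[str]:
        # Path between two tokens through their lowest common ancestor.
        up_a, up_b = path_to_root(a, head), path_to_root(b, head)
        common = next(n for n in up_a if n in set(up_b))
        return up_a[:up_a.index(common) + 1] + list(reversed(up_b[:up_b.index(common)]))

    # Toy head map for "Which city hosted the 2008 Olympics?"
    head = {"Which": "city", "city": "hosted", "the": "Olympics",
            "2008": "Olympics", "Olympics": "hosted"}
    print(relation_path("city", "Olympics", head))  # ['city', 'hosted', 'Olympics']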


Language Independent Passage Retrieval for Question Answering

Passage Retrieval (PR) is typically used as the first step in current Question Answering (QA) systems. Most methods are based on the vector space model, allowing them to find relevant passages for general user needs but failing to select pertinent passages for specific user questions. This paper describes a simple PR method specially suited for the QA task. This method considers the struct...


Investigating Embedded Question Reuse in Question Answering

This paper presents a novel method in question answering (QA) that enables a QA system to gain performance by reusing information from the answer to one question to answer another, related question. Our analysis shows that a pair of questions in general open-domain QA can have an embedding relation through their mentions of noun phrase expressions. We present methods f...



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2023

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v37i11.26578